Samsung will supply 10,000 HBM3E 12-layer memory chips under its agreement with Nvidia. According to the latest reports, preparations have begun to ship these high-performance chips, which will be used in Nvidia’s AI accelerators.
Nvidia Makes Massive Chip Purchase from Samsung
In recent months, rumors circulated that Samsung’s HBM3E memory might finally meet Nvidia’s quality standards. After that development, the open question was how large Samsung’s supply to Nvidia would be. The exact quantity has now been confirmed, and the 10,000-chip order is seen as a significant step toward restoring Samsung’s competitive position in the HBM market.
The chips to be supplied are believed to come from Samsung’s 12-layer, 36GB HBM3E-12H series, currently the most advanced HBM chip the company produces. With bandwidth reaching 1,280 GB/s, it is particularly well suited to AI workloads.
The rapid growth in the AI sector has led technology giants to invest in large-scale server infrastructures. AI accelerators developed by Nvidia and AMD form the foundation of these systems.
These accelerators, however, require high-performance HBM memory, which only a handful of manufacturers can produce. Micron, SK Hynix, and Samsung are the three main suppliers with production capacity in this area.
In recent years, Nvidia has primarily worked with SK Hynix and Micron for HBM supply. SK Hynix had established a market leadership position, particularly with its HBM solutions used in Nvidia’s high-end accelerators such as the GH200 and H100. Samsung, however, was long excluded due to its inability to meet Nvidia’s quality requirements.
This situation appears to have changed with the HBM3E-12H. Samsung’s approval by Nvidia after a rigorous testing and development process shows that the company has re-entered the technology race in this field.